A spatial version of the It\^{o}-Stratonovich correction
We consider a class of stochastic PDEs of Burgers type in spatial dimension
1, driven by space-time white noise. Even though it is well known that these
equations are well posed, it turns out that if one performs a spatial
discretization of the nonlinearity in the "wrong" way, then the sequence of
approximate equations does converge to a limit, but this limit exhibits an
additional correction term. This correction term is proportional to the local
quadratic cross-variation (in space) of the gradient of the conserved quantity
with the solution itself. This can be understood as a consequence of the fact
that for any fixed time, the law of the solution is locally equivalent to
Wiener measure, where space plays the role of time. In this sense, the
correction term is similar to the usual It\^{o}-Stratonovich correction term
that arises when one considers different temporal discretizations of stochastic
ODEs.
Comment: Published at http://dx.doi.org/10.1214/11-AOP662 in the Annals of
Probability (http://www.imstat.org/aop/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
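For readers less familiar with the temporal analogue mentioned above, a schematic statement of the classical Itô–Stratonovich conversion for a stochastic ODE dX_s = b(X_s) ds + σ(X_s) dW_s (standard textbook notation, not taken from the paper) is:

```latex
% The Stratonovich integral differs from the Ito integral by half the
% quadratic cross-variation of sigma(X) with the driving Brownian motion W:
\int_0^t \sigma(X_s) \circ \mathrm{d}W_s
  = \int_0^t \sigma(X_s) \,\mathrm{d}W_s
  + \frac{1}{2} \int_0^t \sigma'(X_s)\,\sigma(X_s)\,\mathrm{d}s
```

The paper's spatial correction term plays a role analogous to the last integral, with space taking the role of time.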
A critical analysis of the Nigerian entrepreneurial ecosystem on transformational entrepreneurship
Previous research suggests that entrepreneurial ecosystem (EE) support factors are critical for the systemic development of micro, small, and medium enterprises (MSMEs). However, there is limited understanding of how MSME owners'/managers' characteristics shape their perspectives on the EE support factors for business development. This study addresses this research gap by adopting a coherent approach to understanding how the EE supports MSMEs in achieving transformational entrepreneurship (TE), which builds sustainable businesses for long-term societal benefit. The MSME characteristics alongside the EE factors were tested with 576 MSMEs in Nigeria. The MSME owners'/managers' characteristics were positively correlated with their perspectives on the EE support factors (access to finance, markets, resources, and policies and regulations), which were found to be inadequate. The findings will assist theory and practice in understanding and focusing on the EE in the context of TE in Nigeria, offering potential insights for similar developing economies.
The evolution of student identity: A case of caveat emptor
Engaging students has been seen as the key to promoting their achievement in higher education institutions. However, there is an important stage prior to this: the development of a positive student identity, which influences students’ motivation to engage. As the student body has evolved from full-time, on-campus students entering university straight from school to embrace adult, part-time and online learners who are also in employment, the transition to a student identity has become less transparent. To encourage part-time students undertaking an undergraduate degree in Social and Health Care Management to engage with each other, the course team piloted peer assessment within the programme for a year. This paper informs the debate by providing insight into the students’ approach to learning and attainment. It is argued that the culture of compliance and the technocratic approach to task completion increasingly required within the social care and learning sector is antithetical to deep learning. For students to make the transition they need to commit to a student identity in which participation in reflection and critical debate are valued. The challenge is for universities to enable this by addressing the barriers and stimulating a positive identity for non-traditional students.
Rethinking Storage Management for Data Processing Pipelines in Cloud Data Centers
Data processing frameworks such as Apache Beam and Apache Spark are used for
a wide range of applications, from logs analysis to data preparation for DNN
training. It is thus unsurprising that there has been a large amount of work on
optimizing these frameworks, including their storage management. The shift to
cloud computing requires optimization across all pipelines concurrently running
across a cluster. In this paper, we look at one specific instance of this
problem: placement of I/O-intensive temporary intermediate data on SSD and HDD.
Efficient data placement is challenging since I/O density is usually unknown at
the time data needs to be placed. Additionally, external factors such as load
variability, job preemption, or job priorities can impact job completion times,
which ultimately affect the I/O density of the temporary files in the workload.
In this paper, we envision that machine learning can be used to solve this
problem. We analyze production logs from Google's data centers for a range of
data processing pipelines. Our analysis shows that I/O density may be
predictable. This suggests that learning-based strategies, if crafted
carefully, could extract predictive features for I/O density of temporary files
involved in various transformations, which could be used to improve the
efficiency of storage management in data processing pipelines.
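The placement decision described above can be sketched as a small predictor that routes each temporary file to SSD or HDD based on its predicted I/O density. All feature names, weights, and thresholds below are illustrative assumptions, not details from the paper or Google's production systems.

```python
# Sketch: place a temporary intermediate file on SSD or HDD based on a
# predicted I/O density (accesses per byte stored). The linear model and
# its features are hypothetical stand-ins for a trained learner.

def predict_io_density(features):
    """Toy linear predictor; a real system would train on production logs."""
    # Hypothetical features: whether the file is shuffle output, and the
    # historical I/O density of files from the same transformation.
    weights = {"is_shuffle_output": 3.0, "historical_density": 0.8}
    bias = 0.5
    return bias + sum(weights[k] * v for k, v in features.items())

def choose_tier(features, ssd_threshold=2.0):
    """Files with high predicted I/O density go to SSD, the rest to HDD."""
    return "SSD" if predict_io_density(features) >= ssd_threshold else "HDD"

hot = {"is_shuffle_output": 1.0, "historical_density": 4.0}
cold = {"is_shuffle_output": 0.0, "historical_density": 0.5}
```

The point of the sketch is the decision structure, not the model: the paper's analysis suggests only that such predictive features may exist in production logs.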
Distilling the Real Cost of Production Garbage Collectors
Abridged abstract: despite the long history of garbage collection (GC) and
its prevalence in modern programming languages, there is surprisingly little
clarity about its true cost. Without understanding their cost, crucial
tradeoffs made by garbage collectors (GCs) go unnoticed. This can lead to
misguided design constraints and evaluation criteria used by GC researchers and
users, hindering the development of high-performance, low-cost GCs. In this
paper, we develop a methodology that allows us to empirically estimate the cost
of GC for any given set of metrics. By distilling out the explicitly
identifiable GC cost, we estimate the intrinsic application execution cost
using different GCs. The minimum distilled cost forms a baseline. Subtracting
this baseline from the total execution costs, we can then place an empirical
lower bound on the absolute costs of different GCs. Using this methodology, we
study five production GCs in OpenJDK 17, a high-performance Java runtime. We
measure the cost of these collectors, and expose their respective key
performance tradeoffs. We find that with a modestly sized heap, production GCs
incur substantial overheads across a diverse suite of modern benchmarks,
spending at least 7-82% more wall-clock time and 6-92% more CPU cycles relative
to the baseline cost. We show that these costs can be masked by concurrency and
generous provisioning of memory/compute. In addition, we find that newer
low-pause GCs are significantly more expensive than older GCs, and,
surprisingly, sometimes deliver worse application latency than stop-the-world
GCs. Our findings reaffirm that GC is by no means a solved problem and that a
low-cost, low-latency GC remains elusive. We recommend adopting the
distillation methodology together with a wider range of cost metrics for future
GC evaluations.
Comment: Camera-ready version
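The baseline-subtraction arithmetic the abstract describes can be sketched in a few lines: subtract each GC's explicitly identifiable cost from its total, take the minimum distilled cost as the intrinsic application cost, and report each GC's overhead relative to that baseline. The numbers below are illustrative, not measurements from the paper.

```python
# Sketch of the distillation methodology: lower-bound each GC's absolute
# cost by comparing total costs against the minimum distilled cost.

def gc_overheads(total_costs, identifiable_gc_costs):
    """total_costs / identifiable_gc_costs: GC name -> cost (e.g. CPU cycles).
    Returns each GC's overhead as a fraction of the baseline."""
    # Distilled cost = total cost minus the explicitly identifiable GC cost.
    distilled = {gc: total_costs[gc] - identifiable_gc_costs[gc]
                 for gc in total_costs}
    # The minimum distilled cost approximates the intrinsic application cost.
    baseline = min(distilled.values())
    # Overhead relative to baseline is an empirical lower bound per GC.
    return {gc: (total_costs[gc] - baseline) / baseline
            for gc in total_costs}

totals = {"G1": 130.0, "Parallel": 115.0, "ZGC": 150.0}   # illustrative
gc_only = {"G1": 20.0, "Parallel": 10.0, "ZGC": 25.0}     # illustrative
```

Because the baseline is itself only an estimate of the intrinsic cost, the resulting overheads are lower bounds, matching the paper's framing.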
Comorbidity and Quality of Life in Adults with Hair Pulling Disorder
Hair pulling disorder (HPD; trichotillomania) is thought to be associated with significant psychiatric comorbidity and functional impairment. However, few methodologically rigorous studies of HPD have been conducted, rendering such conclusions tenuous. The following study examined comorbidity and psychosocial functioning in a well-characterized sample of adults with HPD (N=85) who met DSM-IV criteria, had at least moderate hair pulling severity, and participated in a clinical trial. Results revealed that 38.8% of individuals with HPD had another current psychiatric diagnosis and 78.8% had another lifetime (present and/or past) psychiatric diagnosis. Specifically, HPD showed substantial overlap with depressive, anxiety, addictive, and other body-focused repetitive behavior disorders. The relationships between certain comorbidity patterns, hair pulling severity, current mood and anxiety symptoms, and quality of life were also examined. Results showed that current depressive symptoms were the only predictor of quality of life deficits. Implications of these findings for the conceptualization and treatment of HPD are discussed.
Efficient Online Surface Correction for Real-time Large-Scale 3D Reconstruction
State-of-the-art methods for large-scale 3D reconstruction from RGB-D sensors
usually reduce drift in camera tracking by globally optimizing the estimated
camera poses in real-time without simultaneously updating the reconstructed
surface on pose changes. We propose an efficient on-the-fly surface correction
method for globally consistent dense 3D reconstruction of large-scale scenes.
Our approach uses a dense Visual RGB-D SLAM system that estimates the camera
motion in real-time on a CPU and refines it in a global pose graph
optimization. Consecutive RGB-D frames are locally fused into keyframes, which
are incorporated into a sparse voxel hashed Signed Distance Field (SDF) on the
GPU. On pose graph updates, the SDF volume is corrected on-the-fly using a
novel keyframe re-integration strategy with reduced GPU-host streaming. We
demonstrate in an extensive quantitative evaluation that our method is up to
93% more runtime efficient compared to the state-of-the-art and requires
significantly less memory, with only negligible loss of surface quality.
Overall, our system requires only a single GPU and allows for real-time surface
correction of large environments.
Comment: British Machine Vision Conference (BMVC), London, September 201
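The keyframe re-integration idea above rests on the fact that a weighted-average SDF update is invertible: a keyframe's old contribution can be subtracted out and its corrected contribution added back. The sketch below shows that core update on a toy dict-of-voxels volume; the real system uses sparse voxel hashing on the GPU, and all names here are illustrative.

```python
# Sketch: de-integrate a keyframe's old SDF observations, then integrate
# the observations recomputed under the optimized pose. Each voxel stores
# (weighted-average SDF value D, accumulated weight W).

def integrate_obs(volume, obs, weight=1.0):
    """obs: voxel index -> observed truncated SDF value for this keyframe."""
    for v, d in obs.items():
        D, W = volume.get(v, (0.0, 0.0))
        volume[v] = ((W * D + weight * d) / (W + weight), W + weight)

def deintegrate_obs(volume, obs, weight=1.0):
    """Invert integrate_obs: remove a keyframe's previous contribution."""
    for v, d in obs.items():
        D, W = volume[v]
        new_w = W - weight
        volume[v] = (0.0, 0.0) if new_w <= 0 else \
            ((W * D - weight * d) / new_w, new_w)

def reintegrate(volume, old_obs, new_obs):
    """On a pose-graph update: swap old observations for corrected ones."""
    deintegrate_obs(volume, old_obs)
    integrate_obs(volume, new_obs)

vol = {}
integrate_obs(vol, {0: 0.1})   # another keyframe's observation
integrate_obs(vol, {0: 0.3})   # the keyframe observed under the old pose
reintegrate(vol, {0: 0.3}, {0: 0.5})  # pose update changes its SDF value
```

The paper's contribution is making this re-integration cheap at scale (reduced GPU-host streaming); the arithmetic itself is the standard weighted TSDF update.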